### Rectified Linear Unit (ReLU)

Go back to the [[AI Glossary]]

An activation function with the following rules: if the input is negative or zero, the output is 0; if the input is positive, the output is equal to the input. Equivalently, $\text{ReLU}(x) = \max(0, x)$.
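
As a concrete illustration, here is a minimal sketch of ReLU in Python using NumPy; the function name `relu` is chosen for this example and is not part of the glossary definition.

```python
import numpy as np

def relu(x):
    """ReLU activation: 0 for inputs <= 0, the input itself for inputs > 0."""
    return np.maximum(0, x)

# Negative and zero inputs map to 0; positive inputs pass through unchanged.
print(relu(np.array([-2.0, 0.0, 3.5])))  # [0.  0.  3.5]
```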